# Unsupervised Contrastive Learning
## Nomic Embed Text V2 MoE Unsupervised
nomic-ai · 161 downloads · 5 likes
An intermediate version of a multilingual Mixture-of-Experts (MoE) text embedding model, obtained through multi-stage contrastive training.
Tags: Text Embedding
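The Mixture-of-Experts idea behind this model can be illustrated with a toy gating layer: a softmax gate scores each token against every expert, and only the top-k experts are run and mixed by their renormalised gate weights. A minimal NumPy sketch (all weights random; `moe_layer`, `gate_w`, and the expert matrices are illustrative stand-ins, not this model's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(3)

def moe_layer(x, experts, gate_w, top_k=2):
    """Toy Mixture-of-Experts layer: a gate routes each token to its top-k
    experts and mixes their outputs by renormalised softmax gate weights."""
    scores = x @ gate_w                           # (tokens, n_experts) gating logits
    out = np.zeros((x.shape[0], experts[0].shape[1]))
    for i, row in enumerate(scores):
        top = np.argsort(row)[-top_k:]            # indices of the top-k experts
        w = np.exp(row[top]) / np.exp(row[top]).sum()  # softmax over the selected k
        for weight, e in zip(w, top):
            out[i] += weight * (x[i] @ experts[e])  # weighted sum of expert outputs
    return out

n_experts, d_in, d_out = 4, 8, 6
experts = [rng.normal(size=(d_in, d_out)) for _ in range(n_experts)]
gate_w = rng.normal(size=(d_in, n_experts))
tokens = rng.normal(size=(5, d_in))
y = moe_layer(tokens, experts, gate_w)  # shape (5, 6)
```

With top-k routing, each token activates only a fraction of the parameters, which is what lets MoE embedding models grow total capacity without growing per-token compute.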
## LLM2Vec Meta Llama 3.1 8B Instruct MNTP Unsup SimCSE
McGill-NLP · MIT · 55 downloads · 2 likes
LLM2Vec is a solution for converting decoder-only large language models into text encoders, achieved by enabling bidirectional attention, masked next-token prediction, and unsupervised contrastive learning.
Tags: Text Embedding, English
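The unsupervised contrastive step these LLM2Vec and SimCSE models share follows the SimCSE recipe: each sentence is encoded twice under different dropout masks, and the two views are pulled together with an InfoNCE loss over in-batch negatives. A minimal NumPy sketch, with a toy random-projection "encoder" standing in for the transformer:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, drop=0.1):
    """Toy 'encoder': fixed linear projection plus a fresh random dropout mask,
    so two calls on the same batch differ only in their dropout noise."""
    W = np.ones((x.shape[1], 8))
    h = x @ W
    mask = rng.random(h.shape) > drop
    return h * mask

def info_nce(a, b, tau=0.05):
    """InfoNCE over in-batch negatives: row i of `a` should match row i of `b`."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    logits = (a @ b.T) / tau                     # cosine-similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # cross-entropy, targets on the diagonal

batch = rng.normal(size=(4, 16))        # 4 toy "sentence" feature vectors
z1, z2 = encode(batch), encode(batch)   # two dropout views of the same batch
loss = info_nce(z1, z2)
```

Minimising this loss pushes the two dropout views of each sentence together while pushing apart the other sentences in the batch, which is the entire supervision signal in the unsupervised stage.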
## LLM2Vec Meta Llama 3.1 8B Instruct MNTP
McGill-NLP · MIT · 386 downloads · 2 likes
LLM2Vec is a simple method to convert decoder-only large language models into text encoders by enabling bidirectional attention, masked next-token prediction, and unsupervised contrastive learning.
Tags: Text Embedding, Transformers, English
## LLM2Vec Meta Llama 3 8B Instruct MNTP
McGill-NLP · MIT · 3,885 downloads · 16 likes
LLM2Vec is a simple solution for converting decoder-only large language models into text encoders, achieved by enabling bidirectional attention, masked next-token prediction, and unsupervised contrastive learning.
Tags: Text Embedding, Transformers, English
## LLM2Vec Sheared LLaMA MNTP
McGill-NLP · MIT · 2,430 downloads · 5 likes
LLM2Vec is a simple solution for transforming decoder-only large language models into text encoders, achieved by enabling bidirectional attention, masked next-token prediction, and unsupervised contrastive learning.
Tags: Text Embedding, Transformers, English
## SimCSE Dist MPNet Paracrawl Cs En
Seznam · 2,997 downloads · 3 likes
A Czech-English semantic embedding model fine-tuned with the SimCSE objective, based on Seznam/dist-mpnet-paracrawl-cs-en.
Tags: Text Embedding, Transformers, Supports Multiple Languages
## SimCSE Small E Czech
Seznam · 1,543 downloads · 1 like
A Czech sentence-similarity model fine-tuned with the SimCSE objective, based on the Seznam/small-e-czech model.
Tags: Text Embedding, Transformers, Other
## RankCSE ListMLE BERT Base Uncased
perceptiveshawty · Apache-2.0 · 20 downloads · 0 likes
A bert-base-uncased sentence embedding model in the SimCSE (Simple Contrastive Learning of Sentence Embeddings) family, trained with a RankCSE-style listMLE objective for sentence similarity tasks.
Tags: Text Embedding, Transformers, English
## Erlangshen SimCSE 110M Chinese
IDEA-CCNL · Apache-2.0 · 186 downloads · 21 likes
A Chinese sentence embedding model based on unsupervised SimCSE, further trained with supervised contrastive learning on Chinese NLI data.
Tags: Text Embedding, Transformers, Chinese
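The supervised contrastive stage with NLI data typically follows the supervised-SimCSE recipe: for each premise, the entailment hypothesis serves as the positive and the contradiction hypothesis as a hard negative, alongside the usual in-batch negatives. A minimal NumPy sketch over toy embeddings (`supervised_simcse_loss` and the vectors are illustrative, not this model's actual training code):

```python
import numpy as np

def supervised_simcse_loss(anchor, pos, hard_neg, tau=0.05):
    """Supervised SimCSE-style loss: for anchor row i, column i of `pos` is the
    positive; the other positives plus every `hard_neg` row act as negatives."""
    def unit(x):
        return x / np.linalg.norm(x, axis=1, keepdims=True)
    a, p, n = unit(anchor), unit(pos), unit(hard_neg)
    # similarities to all positives (in-batch negatives) and all hard negatives
    logits = np.concatenate([a @ p.T, a @ n.T], axis=1) / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(a))
    return -np.mean(log_probs[idx, idx])  # correct class: own entailment hypothesis

rng = np.random.default_rng(1)
premise = rng.normal(size=(4, 8))                  # toy premise embeddings
entail = premise + 0.1 * rng.normal(size=(4, 8))   # positives: near the premises
contra = rng.normal(size=(4, 8))                   # hard negatives: unrelated
loss = supervised_simcse_loss(premise, entail, contra)
```

The contradiction hypotheses make the softmax denominator harder than in-batch negatives alone, which is the main gain of the supervised stage over pure dropout-based SimCSE.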
## DeCLUTR Sci Base
johngiorgi · Apache-2.0 · 50 downloads · 9 likes
A SciBERT-based sentence encoder for scientific text, trained with self-supervised learning on 2 million scientific papers.
Tags: Text Embedding, English
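DeCLUTR's self-supervision builds positive pairs without labels by sampling spans from the same document: an anchor span and a nearby, possibly overlapping or adjacent, span are treated as positives for contrastive training. A rough sketch of that sampling idea (`sample_span_pair` is an illustrative helper, not the paper's exact procedure, which also allows subsumed spans and multiple positives per anchor):

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_span_pair(doc_tokens, min_len=3, max_len=6):
    """Sample an anchor span and a nearby positive span from one document,
    in the spirit of DeCLUTR's self-supervised positive-pair construction."""
    n = len(doc_tokens)
    a_len = int(rng.integers(min_len, max_len + 1))
    a_start = int(rng.integers(0, n - a_len + 1))
    anchor = doc_tokens[a_start:a_start + a_len]
    # positive: a span starting somewhere in the neighbourhood of the anchor
    p_len = int(rng.integers(min_len, max_len + 1))
    lo = max(0, a_start - p_len)
    hi = min(n - p_len, a_start + a_len)
    p_start = int(rng.integers(lo, hi + 1))
    positive = doc_tokens[p_start:p_start + p_len]
    return anchor, positive

doc = [f"tok{i}" for i in range(20)]  # a toy 20-token document
anchor, positive = sample_span_pair(doc)
```

Feeding such pairs into an InfoNCE-style loss, with spans from other documents as negatives, is what lets the encoder train on raw scientific papers with no similarity labels.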